Advancing the size and complexity of neural network models leads to an ever-increasing demand for computational resources for their simulation. Neuromorphic devices offer a number of advantages over conventional computing architectures, such as high emulation speed or low power consumption, but this usually comes at the price of reduced configurability and precision. In this article, we investigate the consequences of several such factors that are common to neuromorphic devices, more specifically limited hardware resources, limited parameter configurability and parameter variations. Our final aim is to provide an array of methods for coping with such inevitable distortion mechanisms. As a platform for testing our proposed strategies, we use an executable system specification (ESS) of the BrainScaleS neuromorphic system, which has been designed as a universal emulation back-end for neuroscientific modeling. We address the most essential limitations of this device in detail and study their effects on three prototypical benchmark network models within a well-defined, systematic workflow. For each network model, we start by defining quantifiable functionality measures by which we then assess the effects of typical hardware-specific distortion mechanisms, both in idealized software simulations and on the ESS. For those effects that cause unacceptable deviations from the original network dynamics, we suggest generic compensation mechanisms and demonstrate their effectiveness. Both the suggested workflow and the investigated compensation mechanisms are largely back-end independent and do not require additional hardware configurability beyond the one required to emulate the benchmark networks in the first place. We hereby provide a generic methodological environment for configurable neuromorphic devices that are targeted at emulating large-scale, functional neural networks.
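To make the abstract's workflow concrete, the following is a minimal sketch in plain NumPy, not the paper's ESS or PyNN code: it defines a toy quantifiable functionality measure, applies a hardware-like distortion (fixed-pattern parameter variation on synaptic weights), and compensates with a generic global rescaling. All names and numerical values here are illustrative assumptions, not taken from the BrainScaleS system.

```python
import numpy as np

rng = np.random.default_rng(42)

def functionality_measure(weights, inputs):
    """Toy 'network function': mean output of a linear-threshold population."""
    return np.mean(np.maximum(inputs @ weights, 0.0))

def apply_variation(weights, cv=0.2):
    """Emulate fixed-pattern parameter variation with coefficient of variation cv."""
    return weights * rng.normal(1.0, cv, size=weights.shape)

# Reference ("ideal software") network and its target functionality.
w_ideal = rng.normal(0.5, 0.1, size=(100, 10))
x = rng.random((1000, 100))
target = functionality_measure(w_ideal, x)

# Distorted ("hardware") network deviates from the target.
w_hw = apply_variation(w_ideal)

# Generic compensation: rescale weights to restore the functionality measure.
scale = target / functionality_measure(w_hw, x)
w_comp = w_hw * scale

for label, w in [("ideal", w_ideal), ("distorted", w_hw), ("compensated", w_comp)]:
    print(f"{label:12s} measure = {functionality_measure(w, x):.4f}")
```

In this toy setting the rescaling restores the measure exactly because the measure scales linearly with a positive weight factor; the paper's compensation mechanisms target far richer network dynamics, but the measure-distort-compensate loop follows the same pattern.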